Stochastic Optimization with Importance Sampling
Authors
Abstract
Uniform sampling of training data has been commonly used in traditional stochastic optimization algorithms such as Proximal Stochastic Gradient Descent (prox-SGD) and Proximal Stochastic Dual Coordinate Ascent (prox-SDCA). Although uniform sampling can guarantee that the sampled stochastic quantity is an unbiased estimate of the corresponding true quantity, the resulting estimator may have a rather high variance, which negatively affects the convergence of the underlying optimization procedure. In this paper we study stochastic optimization with importance sampling, which improves the convergence rate by reducing the stochastic variance. Specifically, we study prox-SGD with importance sampling and prox-SDCA with importance sampling. For prox-SGD, instead of adopting uniform sampling throughout the training process, the proposed algorithm employs importance sampling to minimize the variance of the stochastic gradient. For prox-SDCA, the proposed importance sampling scheme aims to achieve higher expected dual value at each coordinate ascent step. We provide extensive theoretical analysis to show that the convergence rates with the proposed importance sampling methods can be significantly improved under suitable conditions both for prox-SGD and for prox-SDCA. Experiments are provided to verify the theoretical analysis.
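As a concrete illustration of the idea described above, the following minimal Python sketch applies importance sampling to plain SGD on an l2-regularized least-squares objective: each example is drawn with probability proportional to a per-example smoothness constant, and the sampled gradient is reweighted by 1/(n p_i) so it remains an unbiased estimate of the full gradient. The objective, the choice L_i = ||x_i||^2 + lambda, and all function and variable names are illustrative assumptions, not the exact prox-SGD algorithm analyzed in the paper.

import numpy as np

# Minimal sketch of SGD with importance sampling on the objective
# F(w) = (1/n) * sum_i f_i(w), with
# f_i(w) = 0.5 * (x_i^T w - y_i)^2 + (lam/2) * ||w||^2.
# Example i is sampled with probability p_i proportional to its smoothness
# constant L_i = ||x_i||^2 + lam, and the sampled gradient is scaled by
# 1/(n * p_i) so its expectation equals the full gradient of F.

def importance_sgd(X, y, lam=1e-2, n_iters=2000, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    L = np.sum(X ** 2, axis=1) + lam      # per-example smoothness constants
    p = L / L.sum()                       # importance-sampling distribution
    step = 1.0 / L.mean()                 # constant step; a decaying step is common in practice
    w = np.zeros(d)
    for _ in range(n_iters):
        i = rng.choice(n, p=p)                        # draw index i with probability p_i
        g_i = (X[i] @ w - y[i]) * X[i] + lam * w      # gradient of f_i at w
        w -= step * g_i / (n * p[i])                  # reweighted, unbiased update
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Rows with widely varying norms, where non-uniform sampling pays off.
    X = rng.normal(size=(200, 10)) * rng.uniform(0.1, 5.0, size=(200, 1))
    y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=200)
    w_hat = importance_sgd(X, y)
    print("mean squared error:", np.mean((X @ w_hat - y) ** 2))

Compared with uniform sampling, concentrating draws on examples with large L_i and reweighting accordingly reduces the variance of the stochastic gradient when the per-example gradients differ widely in magnitude, which is the mechanism exploited in the paper to improve the convergence rate.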
Similar resources
Stochastic Optimization with Importance Sampling for Regularized Loss Minimization
Uniform sampling of training data has been commonly used in traditional stochastic optimization algorithms such as Proximal Stochastic Mirror Descent (prox-SMD) and Proximal Stochastic Dual Coordinate Ascent (prox-SDCA). Although uniform sampling can guarantee that the sampled stochastic quantity is an unbiased estimate of the corresponding true quantity, the resulting estimator may have a rath...
Faster Optimization through Adaptive Importance Sampling
The current state of the art stochastic optimization algorithms (SGD, SVRG, SCD, SDCA, etc.) are based on sampling one active datapoint uniformly at random in each iteration. Changing these probabilities to better reflect the importance of each datapoint is a natural and powerful idea. In this thesis we analyze Stochastic Coordinate Descent methods with fixed non-uniform and adaptive sampling. ...
Importance Sampling in Stochastic Programming: A Markov Chain Monte Carlo Approach
Stochastic programming models are large-scale optimization problems that are used to facilitate decisionmaking under uncertainty. Optimization algorithms for such problems need to evaluate the expected future costs of current decisions, often referred to as the recourse function. In practice, this calculation is computationally difficult as it requires the evaluation of a multidimensional integ...
Adaptive Sample Size and Importance Sampling in Estimation-based Local Search for Stochastic Combinatorial Optimization: A complete analysis
Metaheuristics and local search algorithms have received considerable attention as promising methods for tackling stochastic combinatorial optimization problems. However, in stochastic settings, these algorithms are usually simple extensions of the versions that are originally designed for deterministic optimization and often they lack rigorous integration with techniques that handle the stocha...
Nonlinear Stochastic Optimization by the Monte-Carlo Method
Methods for solving stochastic optimization problems by Monte-Carlo simulation are considered. The stopping and accuracy of the solutions are treated in a statistical manner, testing the hypothesis of optimality according to statistical criteria. A rule for adjusting the Monte-Carlo sample size is introduced to ensure convergence and to find the solution of the stochastic optimization problem...
Journal: CoRR
Volume: abs/1401.2753
Publication year: 2014